Console Output

Training and evaluating model for: Coffee Machine
Dataset length: 35804 windows
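The "windows" counted above are fixed-length slices of the mains signal, produced by a sliding window over the time series. The window length and stride are not shown in the log; the helper below is a minimal sketch of the idea, with both as assumed parameters.

```python
def make_windows(series, window_size, stride=1):
    """Split a 1-D sequence into overlapping fixed-length windows.

    window_size and stride are assumptions -- the log does not state
    the values used to produce the 35804 windows above.
    """
    return [series[i:i + window_size]
            for i in range(0, len(series) - window_size + 1, stride)]
```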

NILMModel(
  (conv1d): Conv1d(9, 9, kernel_size=(3,), stride=(1,), padding=(1,))
  (lstm): LSTM(9, 256, num_layers=3, batch_first=True, dropout=0.1)
  (dropout): Dropout(p=0.1, inplace=False)
  (relu): ReLU()
  (output_layer): Linear(in_features=256, out_features=1, bias=True)
)
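The printed summary above fixes the layer shapes but not the forward pass, so the sketch below is a plausible reconstruction: Conv1d over the 9 input features, ReLU, a 3-layer LSTM, then dropout and a linear head on the last timestep. The ordering of operations in `forward` is an assumption, not something the log confirms.

```python
import torch
import torch.nn as nn

class NILMModel(nn.Module):
    """Reconstruction matching the printed module summary.

    The forward-pass ordering (conv -> relu -> lstm -> dropout on the
    last timestep -> linear) is an assumption; only the layer shapes
    are taken from the log.
    """
    def __init__(self, in_features=9, hidden=256):
        super().__init__()
        self.conv1d = nn.Conv1d(in_features, in_features,
                                kernel_size=3, stride=1, padding=1)
        self.lstm = nn.LSTM(in_features, hidden, num_layers=3,
                            batch_first=True, dropout=0.1)
        self.dropout = nn.Dropout(p=0.1)
        self.relu = nn.ReLU()
        self.output_layer = nn.Linear(hidden, 1)

    def forward(self, x):
        # x: (batch, seq_len, 9); Conv1d wants channels in dim 1
        x = self.conv1d(x.permute(0, 2, 1)).permute(0, 2, 1)
        x = self.relu(x)
        out, _ = self.lstm(x)                # (batch, seq_len, 256)
        out = self.dropout(out[:, -1, :])    # last timestep only
        return self.output_layer(out)        # (batch, 1)
```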
Epoch [1/300], Train Loss: 0.000995
Validation Loss: 0.001045
Epoch [2/300], Train Loss: 0.000976
Validation Loss: 0.001040
Epoch [3/300], Train Loss: 0.000961
Validation Loss: 0.001000
Epoch [4/300], Train Loss: 0.000887
Validation Loss: 0.000839
Epoch [5/300], Train Loss: 0.000751
Validation Loss: 0.000711
Epoch [6/300], Train Loss: 0.000669
Validation Loss: 0.000634
Epoch [7/300], Train Loss: 0.000616
Validation Loss: 0.000616
Epoch [8/300], Train Loss: 0.000578
Validation Loss: 0.000548
Epoch [9/300], Train Loss: 0.000548
Validation Loss: 0.000518
Epoch [10/300], Train Loss: 0.000517
Validation Loss: 0.000511
Epoch [11/300], Train Loss: 0.000496
Validation Loss: 0.000471
Epoch [12/300], Train Loss: 0.000479
Validation Loss: 0.000447
Epoch [13/300], Train Loss: 0.000445
Validation Loss: 0.000431
Epoch [14/300], Train Loss: 0.000406
Validation Loss: 0.000377
Epoch [15/300], Train Loss: 0.000361
Validation Loss: 0.000327
Epoch [16/300], Train Loss: 0.000362
Validation Loss: 0.000340
Epoch [17/300], Train Loss: 0.000328
Validation Loss: 0.000322
Epoch [18/300], Train Loss: 0.000308
Validation Loss: 0.000291
Epoch [19/300], Train Loss: 0.000398
Validation Loss: 0.000325
Epoch [20/300], Train Loss: 0.000335
Validation Loss: 0.000319
Epoch [21/300], Train Loss: 0.000305
Validation Loss: 0.000292
Epoch [22/300], Train Loss: 0.000282
Validation Loss: 0.000288
Epoch [23/300], Train Loss: 0.000271
Validation Loss: 0.000295
Epoch [24/300], Train Loss: 0.000265
Validation Loss: 0.000263
Epoch [25/300], Train Loss: 0.000250
Validation Loss: 0.000237
Epoch [26/300], Train Loss: 0.000251
Validation Loss: 0.000259
Epoch [27/300], Train Loss: 0.000247
Validation Loss: 0.000244
Epoch [28/300], Train Loss: 0.000230
Validation Loss: 0.000226
Epoch [29/300], Train Loss: 0.000224
Validation Loss: 0.000212
Epoch [30/300], Train Loss: 0.000224
Validation Loss: 0.000210
Epoch [31/300], Train Loss: 0.000212
Validation Loss: 0.000200
Epoch [32/300], Train Loss: 0.000212
Validation Loss: 0.000209
Epoch [33/300], Train Loss: 0.000205
Validation Loss: 0.000192
Epoch [34/300], Train Loss: 0.000198
Validation Loss: 0.000188
Epoch [35/300], Train Loss: 0.000202
Validation Loss: 0.000188
Epoch [36/300], Train Loss: 0.000192
Validation Loss: 0.000241
Epoch [37/300], Train Loss: 0.000192
Validation Loss: 0.000176
Epoch [38/300], Train Loss: 0.000186
Validation Loss: 0.000181
Epoch [39/300], Train Loss: 0.000180
Validation Loss: 0.000168
Epoch [40/300], Train Loss: 0.000176
Validation Loss: 0.000176
Epoch [41/300], Train Loss: 0.000173
Validation Loss: 0.000160
Epoch [42/300], Train Loss: 0.000170
Validation Loss: 0.000182
Epoch [43/300], Train Loss: 0.000173
Validation Loss: 0.000156
Epoch [44/300], Train Loss: 0.000165
Validation Loss: 0.000153
Epoch [45/300], Train Loss: 0.000160
Validation Loss: 0.000149
Epoch [46/300], Train Loss: 0.000155
Validation Loss: 0.000170
Epoch [47/300], Train Loss: 0.000157
Validation Loss: 0.000148
Epoch [48/300], Train Loss: 0.000149
Validation Loss: 0.000140
Epoch [49/300], Train Loss: 0.000157
Validation Loss: 0.000144
Epoch [50/300], Train Loss: 0.000147
Validation Loss: 0.000136
Epoch [51/300], Train Loss: 0.000143
Validation Loss: 0.000132
Epoch [52/300], Train Loss: 0.000144
Validation Loss: 0.000133
Epoch [53/300], Train Loss: 0.000141
Validation Loss: 0.000129
Epoch [54/300], Train Loss: 0.000137
Validation Loss: 0.000129
Epoch [55/300], Train Loss: 0.000133
Validation Loss: 0.000129
Epoch [56/300], Train Loss: 0.000139
Validation Loss: 0.000162
Epoch [57/300], Train Loss: 0.000139
Validation Loss: 0.000124
Epoch [58/300], Train Loss: 0.000128
Validation Loss: 0.000120
Epoch [59/300], Train Loss: 0.000143
Validation Loss: 0.000127
Epoch [60/300], Train Loss: 0.000126
Validation Loss: 0.000117
Epoch [61/300], Train Loss: 0.000121
Validation Loss: 0.000118
Epoch [62/300], Train Loss: 0.000128
Validation Loss: 0.000118
Epoch [63/300], Train Loss: 0.000120
Validation Loss: 0.000115
Epoch [64/300], Train Loss: 0.000117
Validation Loss: 0.000110
Epoch [65/300], Train Loss: 0.000115
Validation Loss: 0.000108
Epoch [66/300], Train Loss: 0.000112
Validation Loss: 0.000108
Epoch [67/300], Train Loss: 0.000115
Validation Loss: 0.000109
Epoch [68/300], Train Loss: 0.000109
Validation Loss: 0.000103
Epoch [69/300], Train Loss: 0.000107
Validation Loss: 0.000109
Epoch [70/300], Train Loss: 0.000105
Validation Loss: 0.000100
Epoch [71/300], Train Loss: 0.000105
Validation Loss: 0.000104
Epoch [72/300], Train Loss: 0.000103
Validation Loss: 0.000096
Epoch [73/300], Train Loss: 0.000101
Validation Loss: 0.000097
Epoch [74/300], Train Loss: 0.000099
Validation Loss: 0.000094
Epoch [75/300], Train Loss: 0.000103
Validation Loss: 0.000097
Epoch [76/300], Train Loss: 0.000097
Validation Loss: 0.000097
Epoch [77/300], Train Loss: 0.000099
Validation Loss: 0.000097
Epoch [78/300], Train Loss: 0.000095
Validation Loss: 0.000091
Epoch [79/300], Train Loss: 0.000093
Validation Loss: 0.000090
Epoch [80/300], Train Loss: 0.000091
Validation Loss: 0.000088
Epoch [81/300], Train Loss: 0.000090
Validation Loss: 0.000093
Epoch [82/300], Train Loss: 0.000088
Validation Loss: 0.000089
Epoch [83/300], Train Loss: 0.000089
Validation Loss: 0.000086
Epoch [84/300], Train Loss: 0.000086
Validation Loss: 0.000089
Epoch [85/300], Train Loss: 0.000084
Validation Loss: 0.000084
Epoch [86/300], Train Loss: 0.000083
Validation Loss: 0.000083
Epoch [87/300], Train Loss: 0.000087
Validation Loss: 0.000086
Epoch [88/300], Train Loss: 0.000083
Validation Loss: 0.000081
Epoch [89/300], Train Loss: 0.000080
Validation Loss: 0.000084
Epoch [90/300], Train Loss: 0.000080
Validation Loss: 0.000080
Epoch [91/300], Train Loss: 0.000079
Validation Loss: 0.000077
Epoch [92/300], Train Loss: 0.000077
Validation Loss: 0.000078
Epoch [93/300], Train Loss: 0.000077
Validation Loss: 0.000081
Epoch [94/300], Train Loss: 0.000074
Validation Loss: 0.000082
Epoch [95/300], Train Loss: 0.000074
Validation Loss: 0.000077
Epoch [96/300], Train Loss: 0.000072
Validation Loss: 0.000075
Epoch [97/300], Train Loss: 0.000071
Validation Loss: 0.000074
Epoch [98/300], Train Loss: 0.000071
Validation Loss: 0.000073
Epoch [99/300], Train Loss: 0.000093
Validation Loss: 0.000078
Epoch [100/300], Train Loss: 0.000072
Validation Loss: 0.000074
Epoch [101/300], Train Loss: 0.000068
Validation Loss: 0.000074
Epoch [102/300], Train Loss: 0.000067
Validation Loss: 0.000074
Epoch [103/300], Train Loss: 0.000067
Validation Loss: 0.000072
Epoch [104/300], Train Loss: 0.000071
Validation Loss: 0.000073
Epoch [105/300], Train Loss: 0.000065
Validation Loss: 0.000070
Epoch [106/300], Train Loss: 0.000066
Validation Loss: 0.000071
Epoch [107/300], Train Loss: 0.000064
Validation Loss: 0.000070
Epoch [108/300], Train Loss: 0.000063
Validation Loss: 0.000070
Epoch [109/300], Train Loss: 0.000068
Validation Loss: 0.000068
Epoch [110/300], Train Loss: 0.000063
Validation Loss: 0.000072
Epoch [111/300], Train Loss: 0.000064
Validation Loss: 0.000067
Epoch [112/300], Train Loss: 0.000062
Validation Loss: 0.000070
Epoch [113/300], Train Loss: 0.000061
Validation Loss: 0.000069
Epoch [114/300], Train Loss: 0.000060
Validation Loss: 0.000068
Epoch [115/300], Train Loss: 0.000061
Validation Loss: 0.000068
Epoch [116/300], Train Loss: 0.000059
Validation Loss: 0.000066
Epoch [117/300], Train Loss: 0.000059
Validation Loss: 0.000065
Epoch [118/300], Train Loss: 0.000060
Validation Loss: 0.000103
Epoch [119/300], Train Loss: 0.000071
Validation Loss: 0.000065
Epoch [120/300], Train Loss: 0.000058
Validation Loss: 0.000065
Epoch [121/300], Train Loss: 0.000057
Validation Loss: 0.000065
Epoch [122/300], Train Loss: 0.000057
Validation Loss: 0.000066
Epoch [123/300], Train Loss: 0.000056
Validation Loss: 0.000065
Epoch [124/300], Train Loss: 0.000055
Validation Loss: 0.000064
Epoch [125/300], Train Loss: 0.000059
Validation Loss: 0.000065
Epoch [126/300], Train Loss: 0.000055
Validation Loss: 0.000065
Epoch [127/300], Train Loss: 0.000055
Validation Loss: 0.000063
Epoch [128/300], Train Loss: 0.000054
Validation Loss: 0.000063
Epoch [129/300], Train Loss: 0.000054
Validation Loss: 0.000063
Epoch [130/300], Train Loss: 0.000054
Validation Loss: 0.000065
Epoch [131/300], Train Loss: 0.000054
Validation Loss: 0.000062
Epoch [132/300], Train Loss: 0.000054
Validation Loss: 0.000063
Epoch [133/300], Train Loss: 0.000052
Validation Loss: 0.000065
Epoch [134/300], Train Loss: 0.000057
Validation Loss: 0.000070
Epoch [135/300], Train Loss: 0.000056
Validation Loss: 0.000064
Epoch [136/300], Train Loss: 0.000053
Validation Loss: 0.000062
Epoch [137/300], Train Loss: 0.000052
Validation Loss: 0.000063
Epoch [138/300], Train Loss: 0.000051
Validation Loss: 0.000061
Epoch [139/300], Train Loss: 0.000051
Validation Loss: 0.000063
Epoch [140/300], Train Loss: 0.000052
Validation Loss: 0.000060
Epoch [141/300], Train Loss: 0.000050
Validation Loss: 0.000061
Epoch [142/300], Train Loss: 0.000050
Validation Loss: 0.000060
Epoch [143/300], Train Loss: 0.000051
Validation Loss: 0.000060
Epoch [144/300], Train Loss: 0.000050
Validation Loss: 0.000062
Epoch [145/300], Train Loss: 0.000050
Validation Loss: 0.000063
Epoch [146/300], Train Loss: 0.000049
Validation Loss: 0.000060
Epoch [147/300], Train Loss: 0.000049
Validation Loss: 0.000060
Epoch [148/300], Train Loss: 0.000049
Validation Loss: 0.000060
Epoch [149/300], Train Loss: 0.000048
Validation Loss: 0.000059
Epoch [150/300], Train Loss: 0.000048
Validation Loss: 0.000058
Epoch [151/300], Train Loss: 0.000048
Validation Loss: 0.000059
Epoch [152/300], Train Loss: 0.000048
Validation Loss: 0.000058
Epoch [153/300], Train Loss: 0.000047
Validation Loss: 0.000061
Epoch [154/300], Train Loss: 0.000048
Validation Loss: 0.000058
Epoch [155/300], Train Loss: 0.000048
Validation Loss: 0.000058
Epoch [156/300], Train Loss: 0.000047
Validation Loss: 0.000057
Epoch [157/300], Train Loss: 0.000046
Validation Loss: 0.000058
Epoch [158/300], Train Loss: 0.000050
Validation Loss: 0.000058
Epoch [159/300], Train Loss: 0.000046
Validation Loss: 0.000057
Epoch [160/300], Train Loss: 0.000045
Validation Loss: 0.000058
Epoch [161/300], Train Loss: 0.000045
Validation Loss: 0.000057
Epoch [162/300], Train Loss: 0.000045
Validation Loss: 0.000057
Epoch [163/300], Train Loss: 0.000045
Validation Loss: 0.000057
Epoch [164/300], Train Loss: 0.000048
Validation Loss: 0.000057
Epoch [165/300], Train Loss: 0.000045
Validation Loss: 0.000056
Epoch [166/300], Train Loss: 0.000044
Validation Loss: 0.000056
Epoch [167/300], Train Loss: 0.000044
Validation Loss: 0.000056
Epoch [168/300], Train Loss: 0.000043
Validation Loss: 0.000055
Epoch [169/300], Train Loss: 0.000044
Validation Loss: 0.000057
Epoch [170/300], Train Loss: 0.000044
Validation Loss: 0.000057
Epoch [171/300], Train Loss: 0.000044
Validation Loss: 0.000058
Epoch [172/300], Train Loss: 0.000044
Validation Loss: 0.000059
Epoch [173/300], Train Loss: 0.000043
Validation Loss: 0.000056
Epoch [174/300], Train Loss: 0.000043
Validation Loss: 0.000056
Epoch [175/300], Train Loss: 0.000044
Validation Loss: 0.000056
Epoch [176/300], Train Loss: 0.000043
Validation Loss: 0.000054
Epoch [177/300], Train Loss: 0.000043
Validation Loss: 0.000056
Epoch [178/300], Train Loss: 0.000042
Validation Loss: 0.000056
Epoch [179/300], Train Loss: 0.000042
Validation Loss: 0.000054
Epoch [180/300], Train Loss: 0.000042
Validation Loss: 0.000055
Epoch [181/300], Train Loss: 0.000042
Validation Loss: 0.000054
Epoch [182/300], Train Loss: 0.000042
Validation Loss: 0.000053
Epoch [183/300], Train Loss: 0.000042
Validation Loss: 0.000054
Epoch [184/300], Train Loss: 0.000041
Validation Loss: 0.000053
Epoch [185/300], Train Loss: 0.000042
Validation Loss: 0.000053
Epoch [186/300], Train Loss: 0.000041
Validation Loss: 0.000052
Epoch [187/300], Train Loss: 0.000040
Validation Loss: 0.000052
Epoch [188/300], Train Loss: 0.000040
Validation Loss: 0.000052
Epoch [189/300], Train Loss: 0.000040
Validation Loss: 0.000052
Epoch [190/300], Train Loss: 0.000040
Validation Loss: 0.000054
Epoch [191/300], Train Loss: 0.000041
Validation Loss: 0.000057
Epoch [192/300], Train Loss: 0.000041
Validation Loss: 0.000050
Epoch [193/300], Train Loss: 0.000038
Validation Loss: 0.000049
Epoch [194/300], Train Loss: 0.000038
Validation Loss: 0.000049
Epoch [195/300], Train Loss: 0.000038
Validation Loss: 0.000048
Epoch [196/300], Train Loss: 0.000038
Validation Loss: 0.000053
Epoch [197/300], Train Loss: 0.000038
Validation Loss: 0.000048
Epoch [198/300], Train Loss: 0.000037
Validation Loss: 0.000048
Epoch [199/300], Train Loss: 0.000038
Validation Loss: 0.000048
Epoch [200/300], Train Loss: 0.000037
Validation Loss: 0.000049
Epoch [201/300], Train Loss: 0.000037
Validation Loss: 0.000048
Epoch [202/300], Train Loss: 0.000037
Validation Loss: 0.000048
Epoch [203/300], Train Loss: 0.000037
Validation Loss: 0.000048
Epoch [204/300], Train Loss: 0.000037
Validation Loss: 0.000047
Epoch [205/300], Train Loss: 0.000037
Validation Loss: 0.000048
Epoch [206/300], Train Loss: 0.000037
Validation Loss: 0.000047
Epoch [207/300], Train Loss: 0.000038
Validation Loss: 0.000047
Epoch [208/300], Train Loss: 0.000037
Validation Loss: 0.000047
Epoch [209/300], Train Loss: 0.000036
Validation Loss: 0.000047
Epoch [210/300], Train Loss: 0.000036
Validation Loss: 0.000048
Epoch [211/300], Train Loss: 0.000036
Validation Loss: 0.000046
Epoch [212/300], Train Loss: 0.000036
Validation Loss: 0.000046
Epoch [213/300], Train Loss: 0.000035
Validation Loss: 0.000047
Epoch [214/300], Train Loss: 0.000035
Validation Loss: 0.000047
Epoch [215/300], Train Loss: 0.000035
Validation Loss: 0.000046
Epoch [216/300], Train Loss: 0.000035
Validation Loss: 0.000047
Epoch [217/300], Train Loss: 0.000035
Validation Loss: 0.000048
Epoch [218/300], Train Loss: 0.000035
Validation Loss: 0.000046
Epoch [219/300], Train Loss: 0.000035
Validation Loss: 0.000047
Epoch [220/300], Train Loss: 0.000041
Validation Loss: 0.000046
Epoch [221/300], Train Loss: 0.000035
Validation Loss: 0.000046
Epoch [222/300], Train Loss: 0.000034
Validation Loss: 0.000046
Epoch [223/300], Train Loss: 0.000034
Validation Loss: 0.000046
Epoch [224/300], Train Loss: 0.000034
Validation Loss: 0.000045
Epoch [225/300], Train Loss: 0.000034
Validation Loss: 0.000046
Epoch [226/300], Train Loss: 0.000034
Validation Loss: 0.000046
Epoch [227/300], Train Loss: 0.000034
Validation Loss: 0.000045
Epoch [228/300], Train Loss: 0.000034
Validation Loss: 0.000047
Epoch [229/300], Train Loss: 0.000035
Validation Loss: 0.000046
Epoch [230/300], Train Loss: 0.000036
Validation Loss: 0.000047
Epoch [231/300], Train Loss: 0.000034
Validation Loss: 0.000046
Epoch [232/300], Train Loss: 0.000034
Validation Loss: 0.000047
Epoch [233/300], Train Loss: 0.000034
Validation Loss: 0.000045
Epoch [234/300], Train Loss: 0.000033
Validation Loss: 0.000045
Epoch [235/300], Train Loss: 0.000033
Validation Loss: 0.000047
Epoch [236/300], Train Loss: 0.000033
Validation Loss: 0.000045
Epoch [237/300], Train Loss: 0.000033
Validation Loss: 0.000044
Epoch [238/300], Train Loss: 0.000034
Validation Loss: 0.000046
Epoch [239/300], Train Loss: 0.000033
Validation Loss: 0.000044
Epoch [240/300], Train Loss: 0.000033
Validation Loss: 0.000044
Epoch [241/300], Train Loss: 0.000033
Validation Loss: 0.000044
Epoch [242/300], Train Loss: 0.000032
Validation Loss: 0.000044
Epoch [243/300], Train Loss: 0.000032
Validation Loss: 0.000044
Epoch [244/300], Train Loss: 0.000032
Validation Loss: 0.000043
Epoch [245/300], Train Loss: 0.000033
Validation Loss: 0.000044
Epoch [246/300], Train Loss: 0.000032
Validation Loss: 0.000044
Epoch [247/300], Train Loss: 0.000032
Validation Loss: 0.000045
Epoch [248/300], Train Loss: 0.000032
Validation Loss: 0.000045
Epoch [249/300], Train Loss: 0.000032
Validation Loss: 0.000044
Epoch [250/300], Train Loss: 0.000032
Validation Loss: 0.000044
Epoch [251/300], Train Loss: 0.000032
Validation Loss: 0.000044
Epoch [252/300], Train Loss: 0.000032
Validation Loss: 0.000045
Epoch [253/300], Train Loss: 0.000032
Validation Loss: 0.000044
Epoch [254/300], Train Loss: 0.000031
Validation Loss: 0.000043
Early stopping triggered
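Training stopped at epoch 254 of 300 because validation loss had plateaued. The log does not show the stopping criterion; a common patience-based scheme is sketched below, with the patience value as an assumption.

```python
class EarlyStopping:
    """Stop training when validation loss has not improved for
    `patience` consecutive epochs. The patience and min_delta
    values here are assumptions -- the log does not state them.
    """
    def __init__(self, patience=10, min_delta=0.0):
        self.patience = patience
        self.min_delta = min_delta
        self.best = float("inf")
        self.counter = 0

    def step(self, val_loss):
        """Record one epoch's validation loss; return True to stop."""
        if val_loss < self.best - self.min_delta:
            self.best = val_loss
            self.counter = 0
        else:
            self.counter += 1
        return self.counter >= self.patience
```

Called once per epoch with the validation loss, it breaks the training loop as soon as `step` returns True.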

Evaluating model for: Coffee Machine
Validation MAE: 1.144958 W
Validation MSE: 173.345825 W²
Validation RMSE: 13.166086 W
Signal Aggregate Error (SAE): 0.054896
Normalized Disaggregation Error (NDE): 0.211248
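SAE and NDE are standard NILM metrics: SAE compares total predicted energy to total ground-truth energy, and NDE normalizes the squared prediction error by the squared ground-truth signal. Assuming those standard definitions (the log does not show the formulas used), they can be computed as:

```python
def sae(y_true, y_pred):
    """Signal Aggregate Error: |sum(pred) - sum(true)| / sum(true).

    Measures how well the model recovers total energy, ignoring
    when that energy was consumed.
    """
    total_true = sum(y_true)
    return abs(sum(y_pred) - total_true) / total_true

def nde(y_true, y_pred):
    """Normalized Disaggregation Error:
    sum((true - pred)^2) / sum(true^2).

    Per-sample squared error normalized by the signal's own energy.
    """
    num = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    den = sum(t ** 2 for t in y_true)
    return num / den
```

With these definitions, a perfect prediction scores 0 on both metrics, and NDE = 1 corresponds to predicting zero everywhere.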

[Figure: Training and Validation Loss plot]